1. Identity statement
Reference Type: Conference Paper (Conference Proceedings)
Site: sibgrapi.sid.inpe.br
Holder Code: ibi 8JMKD3MGPEW34M/46T9EHH
Identifier: 8JMKD3MGPBW34M/3JMNT72
Repository: sid.inpe.br/sibgrapi/2015/06.19.21.00
Last Update: 2015:06.19.21.00.11 (UTC) administrator
Metadata Repository: sid.inpe.br/sibgrapi/2015/06.19.21.00.11
Metadata Last Update: 2022:06.14.00.08.09 (UTC) administrator
DOI: 10.1109/SIBGRAPI.2015.24
Citation Key: CamposDrumBast:2015:BaFeBa
Title: BMAX: a bag of features based method for image classification
Format: On-line
Year: 2015
Access Date: 2024, May 02
Number of Files: 1
Size: 839 KiB
2. Context
Author: 1. Campos, Pedro Senna de
2. Drummond, Isabela Neves
3. Bastos, Guilherme Sousa
Affiliation: 1. UNIFEI
2. UNIFEI
3. UNIFEI
Editor: Papa, João Paulo
Sander, Pedro Vieira
Marroquim, Ricardo Guerra
Farrell, Ryan
e-Mail Address: pedrosennapsc@gmail.com
Conference Name: Conference on Graphics, Patterns and Images, 28 (SIBGRAPI)
Conference Location: Salvador, BA, Brazil
Date: 26-29 Aug. 2015
Publisher: IEEE Computer Society
Publisher City: Los Alamitos
Book Title: Proceedings
Tertiary Type: Full Paper
History (UTC): 2015-06-19 21:00:11 :: pedrosennapsc@gmail.com -> administrator ::
2022-06-14 00:08:09 :: administrator -> :: 2015
3. Content and structure
Is the master or a copy?: is the master
Content Stage: completed
Transferable: 1
Version Type: finaldraft
Keywords: Image classification; bag-of-features; HMAX; low feature usage
Abstract: This work presents an image classification method based on bag of features that requires fewer extracted local features to create a representative description of the image. The feature-vector creation process of our approach is inspired by the cortex-like mechanisms of the "Hierarchical Model and X" (HMAX) proposed by Riesenhuber & Poggio. Bag of Max Features (BMAX) works with the distance from each visual word to its nearest feature found in the image, instead of the occurrence frequency of each word. The motivation for reducing the number of features used is to obtain a better trade-off between recognition rate and computational cost. We perform tests on three public image databases commonly used as benchmarks, varying the number of extracted features. The proposed method can use up to 60 times fewer local features than the standard bag of features, with an estimated loss of around 5% in recognition rate, which represents up to a 17-fold reduction in running time.
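As the abstract describes, standard bag of features counts how often each visual word occurs among an image's local features, while BMAX instead records the distance from each visual word to its nearest feature found in the image. A minimal sketch of the two descriptors, assuming precomputed local features and a learned codebook (the function names are illustrative, not taken from the paper):

```python
import numpy as np

def bof_histogram(features, codebook):
    """Standard bag of features: normalized histogram of how often
    each visual word is the nearest word to a local feature."""
    # pairwise distances: (num_features, num_words)
    dists = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=2)
    assignments = dists.argmin(axis=1)          # nearest word per feature
    hist = np.bincount(assignments, minlength=len(codebook)).astype(float)
    return hist / max(len(features), 1)

def bmax_vector(features, codebook):
    """BMAX-style descriptor: for each visual word, the distance to its
    nearest local feature in the image (per the abstract's description)."""
    dists = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=2)
    return dists.min(axis=0)                    # one distance per visual word
```

Because each codebook entry only needs one nearest feature, the BMAX vector remains informative even when far fewer local features are extracted, which is the source of the runtime savings the abstract reports.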
Arrangement 1: urlib.net > SDLA > Fonds > SIBGRAPI 2015 > BMAX: a bag...
Arrangement 2: urlib.net > SDLA > Fonds > Full Index > BMAX: a bag...
doc Directory Content: access
source Directory Content: there are no files
agreement Directory Content:
agreement.html 19/06/2015 18:00 0.7 KiB
4. Conditions of access and use
data URL: http://urlib.net/ibi/8JMKD3MGPBW34M/3JMNT72
zipped data URL: http://urlib.net/zip/8JMKD3MGPBW34M/3JMNT72
Language: en
Target File: PID3762887.pdf
User Group: pedrosennapsc@gmail.com
Visibility: shown
Update Permission: not transferred
5. Allied materials
Mirror Repository: sid.inpe.br/banon/2001/03.30.15.38.24
Next Higher Units: 8JMKD3MGPBW34M/3K24PF8
8JMKD3MGPEW34M/4742MCS
Citing Item List: sid.inpe.br/sibgrapi/2015/08.03.22.49 9
sid.inpe.br/banon/2001/03.30.15.38.24 1
Host Collection: sid.inpe.br/banon/2001/03.30.15.38
6. Notes
Empty Fields: archivingpolicy archivist area callnumber contenttype copyholder copyright creatorhistory descriptionlevel dissemination edition electronicmailaddress group isbn issn label lineage mark nextedition notes numberofvolumes orcid organization pages parameterlist parentrepositories previousedition previouslowerunit progress project readergroup readpermission resumeid rightsholder schedulinginformation secondarydate secondarykey secondarymark secondarytype serieseditor session shorttitle sponsor subject tertiarymark type url volume

